Cross-modal information integration in category learning.

Authors

  • J David Smith
  • Jennifer J R Johnston
  • Robert D Musgrave
  • Alexandria C Zakrzewski
  • Joseph Boomer
  • Barbara A Church
  • F Gregory Ashby
Abstract

An influential theoretical perspective describes an implicit category-learning system that associates regions of perceptual space with response outputs by integrating information preattentionally and predecisionally across multiple stimulus dimensions. In this study, we tested whether this kind of implicit, information-integration category learning is possible across stimulus dimensions lying in different sensory modalities. Humans learned categories composed of conjoint visual-auditory category exemplars comprising a visual component (rectangles varying in the density of contained lit pixels) and an auditory component (in Exp. 1, auditory sequences varying in duration; in Exp. 2, pure tones varying in pitch). The categories had either a one-dimensional, rule-based solution or a two-dimensional, information-integration solution. Humans could solve the information-integration category tasks by integrating information across two stimulus modalities. The results demonstrated an important cross-modal form of sensory integration in the service of category learning, and they advance the field's knowledge about the sensory organization of systems for categorization.
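
For readers unfamiliar with this paradigm, the sketch below illustrates how rule-based and information-integration category structures of the kind described above are commonly constructed in this literature: exemplars are sampled from two bivariate normal distributions whose means differ along a single dimension (rule-based) or along a diagonal that forces both dimensions to be combined (information-integration). The dimension labels, means, and covariances here are illustrative assumptions, not the authors' actual stimulus parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_category_structure(kind, n_per_category=100):
    """Sample illustrative 2-D exemplars (dim 1: visual pixel density,
    dim 2: auditory duration or pitch), in arbitrary units.

    'rule_based': category means differ only along dimension 1, so a
    one-dimensional criterion (e.g., x1 > 50) separates them optimally.
    'information_integration': the means are displaced along a diagonal,
    so no single-dimension rule is optimal and both dimensions must be
    combined before a decision is made.
    """
    cov = np.array([[80.0, 0.0], [0.0, 80.0]])  # equal, uncorrelated variance
    if kind == "rule_based":
        mean_a, mean_b = np.array([35.0, 50.0]), np.array([65.0, 50.0])
    elif kind == "information_integration":
        mean_a, mean_b = np.array([40.0, 60.0]), np.array([60.0, 40.0])
    else:
        raise ValueError(f"unknown category structure: {kind}")
    cat_a = rng.multivariate_normal(mean_a, cov, n_per_category)
    cat_b = rng.multivariate_normal(mean_b, cov, n_per_category)
    return cat_a, cat_b

# Optimal boundaries: a vertical criterion (x1 = 50) for the rule-based
# task; the diagonal x1 - x2 = 0 for the information-integration task,
# which cannot be matched by attending to either dimension alone.
rb_a, rb_b = make_category_structure("rule_based")
ii_a, ii_b = make_category_structure("information_integration")
```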

Similar articles

Time- but not sleep-dependent consolidation promotes the emergence of cross-modal conceptual representations

Conceptual knowledge about objects comprises a diverse set of multi-modal and generalisable information, which allows us to bring meaning to the stimuli in our environment. The formation of conceptual representations poses two key computational challenges: integrating information from different sensory modalities and abstracting statistical regularities across exemplars. Although these proce...

Network Video Online Semi-supervised Classification Algorithm Based on Multiple View Co-training

Because information integration based on multiple modalities suffers from problems such as a complex calculation process and low classification accuracy in network video classification, we propose a network video online semi-supervised classification algorithm based on multiple-view co-training. After extracting features from the text view and the visual view, for the feature vector in each view, u...

It does belong together: cross-modal correspondences influence cross-modal integration during perceptual learning

Experiencing a stimulus in one sensory modality is often associated with an experience in another sensory modality. For instance, seeing a lemon might produce a sensation of sourness. This might indicate some kind of cross-modal correspondence between vision and gustation. The aim of the current study was to explore whether such cross-modal correspondences influence cross-modal integration duri...

Category Training Induces Cross-modal Object Representations in the Adult Human Brain

The formation of cross-modal object representations was investigated using a novel paradigm that was previously successful in establishing unimodal visual category learning in monkeys and humans. The stimulus set consisted of six categories of bird shapes and sounds that were morphed to create different exemplars of each category. Subjects learned new cross-modal bird categories using a one-bac...

Image-Text Multi-Modal Representation Learning by Adversarial Backpropagation

We present a novel method for image-text multi-modal representation learning. To our knowledge, this work is the first approach to apply the adversarial learning concept to multi-modal learning without exploiting image-text pair information to learn multi-modal features. We use only category information, in contrast with most previous methods, which rely on image-text pair information for multi-modal embeddi...


Journal:
  • Attention, Perception & Psychophysics

Volume: 76, Issue: 5

Pages: -

Publication date: 2014